Efficient Rank-one Residue Approximation Method for Graph Regularized Non-negative Matrix Factorization
Authors
Abstract
Nonnegative matrix factorization (NMF) aims to decompose a given data matrix X into the product of two lower-rank nonnegative factor matrices, UV^T. Graph regularized NMF (GNMF) is a recently proposed NMF method that preserves the geometric structure of X during such decomposition. Although GNMF has been widely used in computer vision and data mining, its multiplicative update rule (MUR) based solver suffers from both slow convergence and non-stationarity problems. In this paper, we propose a new efficient GNMF solver called rank-one residue approximation (RRA). Different from MUR, which updates both factor matrices (U and V) as a whole in each iteration round, RRA updates each of their columns by approximating the residue matrix with their outer product. Since each column of both factor matrices is updated optimally in an analytic form, RRA is theoretically and empirically proven to converge rapidly to a stationary point. Moreover, since RRA requires neither extra computational cost nor parameter tuning, it enjoys a simplicity similar to MUR while performing much faster. Experimental results on real-world datasets show that RRA is much more efficient than MUR for GNMF. To confirm the stationarity of the solution obtained by RRA, we conduct clustering experiments on real-world image datasets, comparing with representative GNMF solvers such as MUR and NeNMF. The experimental results confirm the effectiveness of RRA.
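To make the column-wise update concrete, below is a minimal sketch of the rank-one residue idea applied to plain NMF (X ≈ UV^T), without the graph regularization term; in GNMF the update of each column of V would additionally involve the graph Laplacian. The function name rri_update and all variable names are illustrative, not taken from the paper.

import numpy as np

def rri_update(X, U, V, n_iter=100):
    # Rank-one residue iteration for plain NMF (X ~= U @ V.T).
    # Each column pair (U[:, j], V[:, j]) is updated in closed form
    # against the residue left by the other rank-one terms.
    r = U.shape[1]
    for _ in range(n_iter):
        for j in range(r):
            # residue of X with the j-th rank-one term removed
            R = X - U @ V.T + np.outer(U[:, j], V[:, j])
            # optimal nonnegative update of U[:, j] for fixed V[:, j]
            vv = V[:, j] @ V[:, j]
            if vv > 0:
                U[:, j] = np.maximum(R @ V[:, j], 0) / vv
            # optimal nonnegative update of V[:, j] for fixed U[:, j]
            uu = U[:, j] @ U[:, j]
            if uu > 0:
                V[:, j] = np.maximum(R.T @ U[:, j], 0) / uu
    return U, V

Starting from any nonnegative initial factors, e.g. U0 = np.abs(np.random.rand(100, 5)) and V0 = np.abs(np.random.rand(80, 5)) for a 100-by-80 data matrix, each inner step solves a nonnegative least-squares problem exactly, which is what allows this family of methods to reach a stationary point without step-size or learning-rate tuning.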
Similar references
Descent methods for Nonnegative Matrix Factorization
In this paper, we present several descent methods that can be applied to nonnegative matrix factorization, and we analyze a recently developed fast block coordinate method called Rank-one Residue Iteration (RRI). We also give a comparison of these different methods and show that the new block coordinate method has better properties in terms of approximation error and complexity. By interpreting...
MahNMF: Manhattan Non-negative Matrix Factorization
Non-negative matrix factorization (NMF) approximates a non-negative matrix X by a product of two non-negative low-rank factor matrices W and H. NMF and its extensions minimize either the Kullback-Leibler divergence or the Euclidean distance between X and WH to model Poisson noise or Gaussian noise, respectively. In practice, when the noise distribution is heavy tailed, they cannot perform well. This...
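For reference, the two standard NMF objectives mentioned here, written in common notation rather than quoted from the paper, are

\min_{W,H \ge 0} \; \|X - WH\|_F^2
\qquad\text{and}\qquad
\min_{W,H \ge 0} \; \sum_{i,j} \Big( X_{ij} \log \frac{X_{ij}}{(WH)_{ij}} - X_{ij} + (WH)_{ij} \Big),

for Gaussian and Poisson noise respectively. MahNMF instead minimizes the Manhattan (elementwise \ell_1) distance \|X - WH\|_1, which is less sensitive to heavy-tailed noise than the squared Frobenius norm.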
Descent methods for Nonnegative Matrix Factorization (Feb 2008)
In this paper, we present several descent methods that can be applied to nonnegative matrix factorization and we analyze a recently developed fast block coordinate method. We also give a comparison of these different methods and show that the new block coordinate method has better properties in terms of approximation error and complexity. By interpreting this method as a rank-one approximation ...
Non-Negative Matrix Factorization with Sinkhorn Distance
Non-negative Matrix Factorization (NMF) has received considerable attention in various areas for its psychological and physiological interpretation of naturally occurring data whose representation may be parts-based in the human brain. Despite its good practical performance, one shortcoming of the original NMF is that it ignores the intrinsic structure of the data set. On one hand, samples might be on a m...
Learning manifold to regularize nonnegative matrix factorization
In this chapter we discuss how to learn an optimal manifold representation to regularize nonnegative matrix factorization (NMF) for data representation problems. NMF, which tries to represent a nonnegative data matrix as a product of two low-rank nonnegative matrices, has been a popular method for data representation due to its ability to explore the latent parts-based structure of data. Recent stu...
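For context, the fixed-graph regularizer that this line of work builds on is the standard GNMF objective, written here in common notation rather than quoted from the chapter:

\min_{U \ge 0,\, V \ge 0} \; \|X - UV^{T}\|_F^2 + \lambda \, \mathrm{Tr}(V^{T} L V),

where L = D - W is the graph Laplacian of a nearest-neighbor affinity graph over the data points and \lambda \ge 0 controls how strongly the low-dimensional representation V must respect that graph; learning the manifold amounts to optimizing the graph (and hence L) instead of fixing it in advance.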